Tensor Methods for Large, Sparse Nonlinear Least Squares Problems
Authors
Abstract
This paper introduces tensor methods for solving large, sparse nonlinear least squares problems where the Jacobian either is analytically available or is computed by finite difference approximations. Tensor methods have been shown to have very good computational performance for small to medium-sized, dense nonlinear least squares problems. In this paper we consider the application of tensor methods to large, sparse nonlinear least squares problems. This involves an entirely new way of solving the tensor model that is efficient for sparse problems. A number of interesting linear algebraic implementation issues are addressed. The test results of the tensor method applied to a set of sparse nonlinear least squares problems, compared with those of the standard Gauss-Newton method, reveal that the tensor method is significantly more robust and efficient than the standard Gauss-Newton method.
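For orientation, a standard formulation of tensor methods (not quoted from this abstract) replaces the Gauss-Newton linear model of the residual F at the current iterate x_c with a quadratic model whose second-order term T_c is a low-rank third-order tensor:

    M_{\mathrm{GN}}(x_c + d) = F(x_c) + J(x_c)\,d
    M_{T}(x_c + d) = F(x_c) + J(x_c)\,d + \tfrac{1}{2}\, T_c\, d\, d

where T_c is typically chosen so that the quadratic model interpolates residual values from recent past iterates, and the step minimizes \|M_T(x_c + d)\|_2 rather than the Gauss-Newton objective \|M_{\mathrm{GN}}(x_c + d)\|_2.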
Similar articles
A Nonlinear GMRES Optimization Algorithm for Canonical Tensor Decomposition
A new algorithm is presented for computing a canonical rank-R tensor approximation that has minimal distance to a given tensor in the Frobenius norm, where the canonical rank-R tensor consists of the sum of R rank-one tensors. Each iteration of the method consists of three steps. In the first step, a tentative new iterate is generated by a stand-alone one-step process, for which we use alternat...
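For reference (symbols chosen here for illustration, not taken from the abstract), the canonical rank-R approximation problem for a third-order tensor \mathcal{X} reads

    \min_{a_r,\, b_r,\, c_r} \;\Big\| \mathcal{X} - \sum_{r=1}^{R} a_r \circ b_r \circ c_r \Big\|_F ,

where \circ denotes the vector outer product, so each summand is a rank-one tensor; the alternating least squares step mentioned above updates one block of factor vectors at a time while the others are held fixed.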
Parallel Tensor Methods for Nonlinear Equations and Nonlinear Least Squares
We describe the design and computational performance of parallel row-oriented tensor algorithms for the solution of dense systems of nonlinear equations and nonlinear least squares problems on a distributed-memory MIMD multiprocessor. Tensor methods are general purpose methods that base each iteration upon a quadratic model of the nonlinear function, rather than the standard linear model, where...
Fast and Unique Tucker Decompositions via Multiway Blind Source Separation
A multiway blind source separation (MBSS) method is developed to decompose large-scale tensor (multiway array) data. Benefitting from all kinds of well-established constrained low-rank matrix factorization methods, MBSS is quite flexible and able to extract unique and interpretable components with physical meaning. The multilinear structure of Tucker and the essential uniqueness of BSS methods ...
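For context (standard Tucker notation, not drawn from the abstract), a Tucker decomposition of a third-order tensor \mathcal{X} has the multilinear form

    \mathcal{X} \approx \mathcal{G} \times_1 A^{(1)} \times_2 A^{(2)} \times_3 A^{(3)} ,

with a small core tensor \mathcal{G} and mode-n factor matrices A^{(n)}; plain Tucker models are not unique, which is why the MBSS approach above imposes BSS-style constraints on the factor matrices (for example nonnegativity, sparsity, or statistical independence in typical BSS settings) to obtain essentially unique, interpretable components.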
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
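As a generic illustration (not necessarily the authors' exact formulation), constrained nonlinear least squares and an l1 exact penalty function of Coleman-Conn type can be written as

    \min_x \; \tfrac{1}{2}\,\|F(x)\|_2^2 \quad \text{subject to} \quad c(x) = 0 ,
    \qquad
    \phi_\mu(x) = \tfrac{1}{2}\,\|F(x)\|_2^2 + \mu \sum_i |c_i(x)| ,

where, for a sufficiently large penalty parameter \mu, local minimizers of \phi_\mu correspond to solutions of the constrained problem; the structured Hessian updates approximate the second-order residual information that the Gauss-Newton part of the model omits.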
Inexact trust region method for large sparse nonlinear least squares
The main purpose of this paper is to show that linear least squares methods based on bidiagonalization, namely the LSQR algorithm, can be used for generation of trust region path. This property is a basis for an inexact trust region method which uses the LSQR algorithm for direction determination. This method is very efficient for large sparse nonlinear least squares as it is supported by numer...
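The following Python sketch is only an illustration of the underlying linear algebra, not the authors' method: scipy.sparse.linalg.lsqr applies Golub-Kahan bidiagonalization to the damped problem min ||J d + F||^2 + lam^2 ||d||^2, and the resulting steps d(lam) shrink as the damping grows, which is the kind of regularized path a trust region strategy works along. The Jacobian and residual below are synthetic placeholders.

    # Minimal sketch (illustrative only): LSQR-based damped Gauss-Newton steps.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import lsqr

    def damped_step(J, F, lam):
        """Return d minimizing ||J d + F||^2 + lam^2 ||d||^2 via bidiagonalization."""
        return lsqr(J, -F, damp=lam)[0]

    # Synthetic sparse Jacobian and residual (placeholders, not test-problem data).
    J = sp.random(50, 20, density=0.1, format="csr", random_state=0)
    F = np.ones(50)
    for lam in (0.0, 0.1, 1.0):
        d = damped_step(J, F, lam)
        print(f"lam = {lam}: ||d|| = {np.linalg.norm(d):.4f}")  # step norm shrinks as lam grows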
Journal: SIAM J. Scientific Computing
Volume 21, Issue -
Pages -
Published 1999